# Chinese Natural Language Processing
## Sbert Chinese General V1
- **Author:** DMetaSoul
- **License:** Apache-2.0
- **Tags:** Text Embedding, Transformers, Chinese
- **Downloads / Likes:** 388 / 6

A general-purpose Chinese sentence embedding model for sentence-similarity and semantic-search tasks.
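In practice, an embedding model like this is used by encoding sentences into vectors and ranking candidates by cosine similarity. A minimal sketch of that scoring step, with dummy vectors standing in for the model's real output (the values and dimensionality here are purely illustrative):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-dimensional "sentence embeddings"; a real model
# would produce these from input sentences.
emb_query = np.array([0.1, 0.3, 0.5, 0.2])
emb_doc_a = np.array([0.1, 0.3, 0.5, 0.2])   # identical direction -> similarity 1.0
emb_doc_b = np.array([0.5, -0.2, 0.1, 0.0])  # dissimilar direction

# Rank documents by similarity to the query.
print(cosine_similarity(emb_query, emb_doc_a))
print(cosine_similarity(emb_query, emb_doc_b))
```

For semantic search, the same score is computed between one query vector and many pre-computed document vectors, and the top-scoring documents are returned.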
## Rbt4
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Chinese
- **Downloads / Likes:** 22 / 6

A Chinese pre-trained BERT model using the whole-word-masking strategy, released by the Harbin Institute of Technology (HIT)-iFLYTEK Joint Laboratory to accelerate Chinese natural language processing research.
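Whole word masking, which several of the models below share, masks every token of a selected word rather than individual sub-tokens; for Chinese, where BERT tokenizes by character, that means masking all characters of a word found by a word segmenter. A minimal sketch of the idea (the segmentation spans and masking ratio are illustrative, not the models' exact training recipe):

```python
import random

def whole_word_mask(tokens, word_spans, mask_ratio=0.15,
                    mask_token="[MASK]", seed=0):
    """Mask whole words: if a word is chosen, mask ALL of its tokens.

    tokens:     character-level tokens, as in Chinese BERT.
    word_spans: (start, end) index spans from a word segmenter.
    """
    rng = random.Random(seed)
    n_to_mask = max(1, int(len(tokens) * mask_ratio))
    chosen = set()
    # Pick whole words at random until enough tokens are covered.
    for start, end in rng.sample(word_spans, len(word_spans)):
        if len(chosen) >= n_to_mask:
            break
        chosen.update(range(start, end))
    return [mask_token if i in chosen else t for i, t in enumerate(tokens)]

# "使用语言模型" segmented into the words 使用 / 语言 / 模型.
tokens = ["使", "用", "语", "言", "模", "型"]
spans = [(0, 2), (2, 4), (4, 6)]
print(whole_word_mask(tokens, spans))
```

The key difference from vanilla masked language modeling is that both characters of a chosen two-character word are masked together, so the model cannot recover one character trivially from the other.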
## Chinese Bert Wwm Ext
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Chinese
- **Downloads / Likes:** 24.49k / 174

A Chinese pre-trained BERT model employing the whole-word-masking strategy, aimed at accelerating Chinese natural language processing research.
## Rbt6
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Chinese
- **Downloads / Likes:** 796 / 9

A six-layer RoBERTa-wwm-ext model retrained with the whole-word-masking technique for Chinese pre-training.
## Chinese Xlnet Base
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Transformers, Chinese
- **Downloads / Likes:** 1,149 / 31

An XLNet model pre-trained for Chinese, aimed at enriching Chinese natural language processing resources and offering a more diverse choice of Chinese pre-trained models.
## Rbtl3
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Chinese
- **Downloads / Likes:** 767 / 4

A retrained three-layer RoBERTa-wwm-ext-large model, i.e. a Chinese pre-trained BERT model employing the whole-word-masking strategy, aimed at accelerating the development of Chinese natural language processing.
## Rbt3
- **Author:** hfl
- **License:** Apache-2.0
- **Tags:** Large Language Model, Chinese
- **Downloads / Likes:** 6,626 / 35

A Chinese pre-trained BERT model employing the whole-word-masking technique, developed by the HIT-iFLYTEK Joint Lab to accelerate advances in Chinese natural language processing.